# Czech BERT

## FERNET-C5 (fav-kky)
FERNET-C5 is a Czech monolingual BERT-base model pretrained on C5, a 93 GB cleaned Czech web corpus.
Tags: Large Language Model, Transformers, Other
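
These checkpoints expose the standard Hugging Face Transformers interface. A minimal loading sketch is shown below, assuming the model is published on the Hub under the ID fav-kky/FERNET-C5 (the exact repository ID is an assumption):

```python
# Minimal sketch: encode a Czech sentence with a pretrained Czech BERT.
# The Hub ID "fav-kky/FERNET-C5" is an assumption about the naming;
# substitute the actual repository ID if it differs.
from transformers import AutoTokenizer, AutoModel

model_id = "fav-kky/FERNET-C5"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("Praha je hlavní město České republiky.", return_tensors="pt")
outputs = model(**inputs)

# One contextual vector per subword token.
print(outputs.last_hidden_state.shape)
```
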
## Czert B Base Cased Long Zero Shot (UWB-AIR)
Czert is a Czech language representation model based on BERT, specifically optimized for Czech and supporting various downstream tasks.
Tags: Large Language Model, Transformers, Other

## Czert B Base Cased (UWB-AIR)
CZERT is a language representation model trained specifically for Czech, outperforming multilingual BERT models on various Czech NLP tasks.
Tags: Large Language Model, Transformers, Other
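
Since the Czert checkpoints are advertised for downstream Czech NLP tasks, a fine-tuning entry point might look like the following sketch, which attaches a fresh sequence-classification head. The Hub ID UWB-AIR/Czert-B-base-cased and the two-label sentiment setup are assumptions for illustration:

```python
# Minimal sketch: attach a classification head to a Czech BERT encoder
# for a downstream task (e.g. sentiment). The Hub ID "UWB-AIR/Czert-B-base-cased"
# is an assumption; replace it with the actual repository name.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "UWB-AIR/Czert-B-base-cased"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

batch = tokenizer(
    ["Ten film byl skvělý.", "Ten film byl hrozný."],
    padding=True,
    return_tensors="pt",
)
logits = model(**batch).logits  # head is randomly initialized: fine-tune before use
print(logits.shape)  # (2, num_labels)
```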